Hyper-Parameter Tuning for Graph Kernels via Multiple Kernel Learning
Abstract
Kernelized learning algorithms have seen a steady growth in popularity during the last decades. The procedure for estimating the performance of these kernels in real applications is typically computationally demanding due to the process of hyper-parameter selection. This is especially true for graph kernels, which are computationally quite expensive. In this paper, we study an approach that substitutes the commonly adopted procedure for kernel hyper-parameter selection with a multiple kernel learning procedure that learns a linear combination of kernel matrices obtained from the same kernel with different values of the hyper-parameters. Empirical results on real-world graph datasets show that the proposed methodology is faster than the baseline method when the number of parameter configurations is large, while always maintaining comparable, and in some cases superior, performance.
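The following is a minimal sketch, not the authors' implementation, contrasting the usual per-value grid search for a graph-kernel hyper-parameter with building a single combined kernel from all candidate matrices. The function graph_kernel_matrix is a hypothetical stand-in for a real graph kernel, and the kernel-target-alignment weights are only an illustrative substitute for whatever MKL solver is actually used in the paper.

```python
# A minimal sketch (assumptions: graphs given as adjacency matrices, binary
# labels, scikit-learn available). graph_kernel_matrix() is a hypothetical
# placeholder, not the kernel evaluated in the paper.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score


def graph_kernel_matrix(adjs, h):
    # Hypothetical stand-in for a real graph kernel (e.g. one with h
    # iterations): an RBF of width h over degree histograms, used only so
    # that the sketch runs end to end on a list of adjacency matrices.
    feats = []
    for A in adjs:
        hist, _ = np.histogram(np.asarray(A).sum(axis=1), bins=10, range=(0, 10))
        feats.append(hist.astype(float))
    X = np.array(feats)
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2.0 * h ** 2))


def baseline_grid_search(adjs, y, grid):
    # Standard procedure: one kernel matrix per hyper-parameter value, each
    # cross-validated separately; the cost grows linearly with len(grid).
    scores = {h: cross_val_score(SVC(kernel="precomputed"),
                                 graph_kernel_matrix(adjs, h), y, cv=3).mean()
              for h in grid}
    return max(scores, key=scores.get)


def combined_kernel(adjs, y, grid):
    # Alternative studied in the paper: keep every candidate matrix and learn
    # a non-negative linear combination instead of selecting a single value.
    # Kernel-target alignment weights (binary labels assumed) are a simple
    # stand-in for the actual MKL step.
    y_pm = np.where(np.asarray(y) == np.asarray(y)[0], 1.0, -1.0)
    target = np.outer(y_pm, y_pm)
    Ks = [graph_kernel_matrix(adjs, h) for h in grid]
    w = np.array([max((K * target).sum(), 0.0) / np.linalg.norm(K) for K in Ks])
    w /= w.sum()
    return sum(wi * Ki for wi, Ki in zip(w, Ks))
```

Given adjacency matrices and binary labels, the combined matrix can be passed once to SVC(kernel="precomputed"), whereas the baseline cross-validates len(grid) separate matrices; a large grid of configurations is the setting in which the abstract reports the method being faster than the baseline.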
Similar resources
Semi-Supervised Composite Kernel Learning Using Distance Metric Learning Techniques
The distance metric has a key role in many machine learning and computer vision algorithms, so choosing an appropriate distance metric has a direct effect on the performance of such algorithms. Recently, distance metric learning using labeled data or other available supervisory information has become a very active research area in machine learning applications. Studies in this area have shown t...
Neural Network-Based Learning Kernel for Automatic Segmentation of Multiple Sclerosis Lesions on Magnetic Resonance Images
Background: Multiple Sclerosis (MS) is a degenerative disease of the central nervous system. MS patients have regions of dead tissue in their brains called MS lesions. MRI is an imaging technique sensitive to soft tissues such as the brain that shows MS lesions as hyper-intense or hypo-intense signals. Since manual segmentation of these lesions is a laborious and time-consuming task, automatic segmentation ...
A Comparison Study of Nonlinear Kernels
Compared to the linear kernel, nonlinear kernels can often substantially improve the accuracy of many machine learning algorithms. In this paper, we compare 5 different nonlinear kernels: min-max, RBF, fRBF (folded RBF), acos, and acos-χ, on a wide range of publicly available datasets. The proposed fRBF kernel performs very similarly to the RBF kernel. Both RBF and fRBF kernels require an importan...
Information theoretic graph kernels
This thesis addresses the problems that arise in state-of-the-art structural learning methods for (hyper)graph classification or clustering, particularly focusing on developing novel information theoretic kernels for graphs. To this end, we commence in Chapter 3 by defining a family of Jensen-Shannon diffusion kernels, i.e., the information theoretic kernels, for (un)attributed graphs. We show ...
Kernel Graph Convolutional Neural Networks
Graph kernels have been successfully applied to many graph classification problems. Typically, a kernel is first designed, and then an SVM classifier is trained based on the features defined implicitly by this kernel. This two-stage approach decouples data representation from learning, which is suboptimal. On the other hand, Convolutional Neural Networks (CNNs) have the capability to learn thei...
Publication date: 2016